Abstract
This work presents a modular, deep learning-based system for pomegranate disease detection that provides field-level decision support to farmers and agronomists. A mobile application (React Native/Expo) lets end-users capture images of pomegranate fruits or leaves along with the GPS location. Each image is processed by a Node.js/Express backend, which calls dedicated TensorFlow Lite (TFLite) models for rapid inference. Two separate models are employed: a multi-class classifier for fruit diseases (Alternaria, Anthracnose, Bacterial Blight, Cercospora, and Healthy) and a binary classifier for leaf health (diseased/healthy). After prediction, the backend attaches domain-specific cure and prevention recommendations drawn from JSON files, and the application displays the predicted class, confidence score, and actionable advice in a user-friendly UI. The modular architecture allows new models or crop types to be added with minimal effort. The system achieved an overall accuracy of approximately 94% on the test set.
Introduction
Pomegranate cultivation is significantly affected by diseases such as Anthracnose, Bacterial Blight, Alternaria, and Cercospora, which reduce yield and quality. Traditional disease identification through manual visual inspection is often inaccurate, time-consuming, and inaccessible to farmers in rural areas, especially during early disease stages when symptoms appear similar. To address these challenges, this study presents a Pomegranate Fruit and Leaf Disease Detection System that uses deep learning and image processing to provide fast and accurate disease diagnosis.
The proposed system employs a lightweight Convolutional Neural Network (CNN) integrated into a mobile application, allowing farmers to capture images of fruits or leaves and receive instant disease predictions along with treatment recommendations. The architecture includes a React Native mobile frontend, a Node.js and Express backend, a Python-based AI inference engine using TensorFlow Lite (TFLite), and a structured knowledge base containing cure and prevention guidelines.
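To make the inference step concrete, the sketch below shows how a Python TFLite engine of the kind described might load a model and map its output vector to a disease label and confidence score. The model filename, input conventions, and function names are illustrative assumptions, not the authors' exact implementation; only the five class labels come from the paper.

```python
import numpy as np

# Class order is an assumption; the trained model's actual label order may differ.
FRUIT_LABELS = ["Alternaria", "Anthracnose", "Bacterial Blight", "Cercospora", "Healthy"]

def top_prediction(probs, labels):
    """Map a probability vector to a (label, confidence) pair."""
    idx = int(np.argmax(probs))
    return labels[idx], float(probs[idx])

def classify_fruit(image_batch, model_path="fruit_disease.tflite"):
    """Run the multi-class fruit model on a preprocessed image batch.

    `model_path` is a hypothetical filename. TensorFlow is imported lazily so
    the pure-Python postprocessing above stays usable without it installed.
    """
    import tensorflow as tf
    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], image_batch.astype(np.float32))
    interpreter.invoke()
    probs = interpreter.get_tensor(out["index"])[0]
    return top_prediction(probs, FRUIT_LABELS)
```

The backend would call `classify_fruit` (or its binary leaf-model counterpart) per uploaded image and merge the resulting label with the JSON knowledge base before responding to the app.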
A custom dataset of pomegranate images covering five classes—Alternaria, Anthracnose, Bacterial Blight, Cercospora, and Healthy—was developed and used to train the CNN. Images were preprocessed and the model was optimized for mobile deployment with fewer than 0.5 million parameters. The system uses two specialized models: a multi-class classifier for fruit diseases and a binary classifier for leaf health. The trained models achieved approximately 87–94% accuracy, with fast inference times suitable for real-time field use.
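The preprocessing mentioned above might look like the following minimal sketch. The 224×224 input size and [0, 1] pixel scaling are assumptions (the paper does not state the exact parameters here), and the nearest-neighbour resize is a dependency-free stand-in for a real image library such as Pillow or OpenCV.

```python
import numpy as np

def preprocess(image, size=224):
    """Resize an H×W×3 uint8 image (nearest neighbour) and scale to [0, 1].

    Returns a float32 batch of shape (1, size, size, 3) ready for the
    TFLite interpreter. `size=224` is an assumed model input shape.
    """
    h, w = image.shape[:2]
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = image[rows][:, cols]
    return (resized.astype(np.float32) / 255.0)[np.newaxis, ...]
```

In the deployed pipeline the same normalization used during training must be applied at inference time, otherwise the model's confidence scores become unreliable.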
Results demonstrate that the system provides reliable, rapid diagnosis (within 3–4 seconds), high accuracy, and practical usability for farmers. The multilingual mobile interface delivers clear disease identification and actionable treatment advice, reducing dependency on experts and minimizing crop losses. Overall, the system offers a scalable, efficient, and farmer-friendly solution for early pomegranate disease detection and sustainable crop management.
Conclusion
We developed a comprehensive AI-driven platform for pomegranate disease diagnosis that combines deep learning models with a mobile application. By leveraging lightweight TensorFlow Lite CNNs, the system achieves high classification accuracy (~94%) while remaining efficient enough for mobile deployment. The React Native app and Node.js backend create a smooth user experience: farmers capture an image, send it to the server, and receive instant feedback along with scientifically validated cure and prevention advice. The end-to-end integration of image capture, on-the-fly inference, and an advisory knowledge base makes this tool a practical decision-support system for modern horticulture. In trials, the solution remained stable under varying field conditions and device quality, underscoring its potential to reduce misdiagnosis and crop losses.